Lecture 15: Chernoff bounds and Sequential detection

Abstract

1 Chernoff Bounds

1.1 Bayesian Hypothesis Test

A test using the log-likelihood ratio statistic has the form
$$T(Y) = \log L(Y) \underset{H_0}{\overset{H_1}{\gtrless}} \tau. \qquad (1)$$

Bound-1: the probability of error $P_e$ is bounded as
$$P_e \le (\pi_0 + \pi_1 e^{\tau})\, e^{\mu_{T,0}(s_0) - s_0\tau}, \qquad (2)$$
where $\mu_{T,0}(s) = \log E_0[e^{sT(Y)}]$ and $s_0$ satisfies $\mu'_{T,0}(s_0) = \tau$.

Bound-2: for all $s \in [0, 1]$,
$$P_e \le \max(\pi_0, \pi_1 e^{\tau})\, e^{\mu_{T,0}(s) - s\tau}. \qquad (3)$$

Derivation of the above bound: consider
$$P_e = \pi_0 P_0(\Gamma_1) + \pi_1 P_1(\Gamma_0) = \pi_0 \int \cdots$$
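The abstract cuts off before the derivation, but both bounds are easy to evaluate on a concrete problem. Below is a minimal numerical sketch, assuming a Gaussian shift-detection setup ($H_0$: $Y \sim N(0,1)$ vs. $H_1$: $Y \sim N(m,1)$) that is our own illustrative choice rather than the lecture's; for it the CGF has the closed form $\mu_{T,0}(s) = (m^2/2)\,s(s-1)$, so Bound-1 and Bound-2 can be compared against the exact Bayes error.

```python
import numpy as np
from scipy.stats import norm

# Toy Gaussian shift detection (our own illustrative setup, not from the
# lecture): H0: Y ~ N(0,1) vs H1: Y ~ N(m,1) with priors pi0, pi1.
# The LLR is T(Y) = m*Y - m^2/2, so under H0, T ~ N(-m^2/2, m^2) and the
# cumulant generating function has the closed form
#     mu_{T,0}(s) = log E_0[exp(s*T)] = (m^2/2) * s * (s - 1).

m = 2.0                       # mean shift under H1
pi0 = pi1 = 0.5               # equal priors
tau = np.log(pi0 / pi1)       # Bayes threshold (0 here)

def mu_T0(s):
    """CGF of the LLR under H0 for this Gaussian problem."""
    return 0.5 * m**2 * s * (s - 1.0)

# Bound-1: pick s0 with mu'_{T,0}(s0) = tau, i.e. s0 = 1/2 + tau/m^2.
s0 = 0.5 + tau / m**2
bound1 = (pi0 + pi1 * np.exp(tau)) * np.exp(mu_T0(s0) - s0 * tau)

# Bound-2: holds for every s in [0, 1]; take the best s on a grid.
s = np.linspace(0.0, 1.0, 1001)
bound2 = max(pi0, pi1 * np.exp(tau)) * np.min(np.exp(mu_T0(s) - s * tau))

# Exact Bayes error for comparison: decide H1 iff Y > m/2 + tau/m.
y_star = m / 2 + tau / m
pe = pi0 * norm.sf(y_star) + pi1 * norm.cdf(y_star - m)

print(f"exact Pe = {pe:.4f}, Bound-1 = {bound1:.4f}, Bound-2 = {bound2:.4f}")
```

With m = 2 and equal priors this prints an exact error near 0.159 against bounds of roughly 0.61 and 0.30: loose in absolute terms, but both capture the exponential decay in the shift m.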

Similar Resources

Lecture 2: Matrix Chernoff bounds

The purpose of my second and third lectures is to discuss spectral sparsifiers, which are the second key ingredient in most of the fast Laplacian solvers. In this lecture we will discuss concentration bounds for sums of random matrices, which are an important technical tool underlying the simplest sparsifier construction.

Lecture 5 — September 8, 2016

1 Overview In the last lecture we took a more in-depth look at Chernoff bounds and introduced subgaussian and subexponential variables. In this lecture we will continue with subgaussian variables and the related classes of subexponential and subgamma random variables, and finally we will give a proof of the famous Johnson-Lindenstrauss lemma using properties of subgaussian/subgamma variables. Definition ...
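As a quick empirical companion (a toy sketch of our own, not the lecture's proof), the snippet below checks the Johnson-Lindenstrauss phenomenon directly: a scaled Gaussian random projection, whose rows are subgaussian, keeps all pairwise distances of a random point set close to their original values.

```python
import numpy as np

# Hedged toy check of the JL lemma: project n points from R^d down to R^k
# with a scaled Gaussian matrix and measure the pairwise-distance distortion.
rng = np.random.default_rng(0)
n, d, k = 50, 2000, 300
X = rng.normal(size=(n, d))                 # n points in R^d
P = rng.normal(size=(d, k)) / np.sqrt(k)    # JL map: scaled Gaussian matrix
Y = X @ P

i, j = np.triu_indices(n, k=1)              # all pairs of points
before = np.linalg.norm(X[i] - X[j], axis=1)
after = np.linalg.norm(Y[i] - Y[j], axis=1)
ratio = after / before
print(f"distortion range: [{ratio.min():.3f}, {ratio.max():.3f}]")  # near 1
```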

Lecturer: David P. Williamson, Scribe: Faisal Alkaabneh

Today we will look at a matrix analog of the standard scalar Chernoff bounds. This matrix analog will be used in the next lecture when we talk about graph sparsification. While we're more interested in the application of the theorem than its proof, it's still useful to see the similarities and differences in moving from the proof of the result for scalars to the same result for matrices. Sc...
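As a rough illustration of what such matrix concentration bounds control (again a toy sketch of our own, not from these notes), the snippet below averages independent rank-one matrices xxᵀ with x uniform on the unit sphere, so the expectation is I/d, and watches the extreme eigenvalues of the scaled sum approach 1 as the number of samples grows.

```python
import numpy as np

# Matrix-concentration toy: for x uniform on the unit sphere in R^d,
# E[x x^T] = I/d, so (d/n) * sum of n i.i.d. copies of x x^T should have
# all eigenvalues near 1 for large n; matrix Chernoff bounds quantify
# how fast the extremes converge.
rng = np.random.default_rng(0)
d = 16
for n in (100, 1000, 10000):
    xs = rng.normal(size=(n, d))
    xs /= np.linalg.norm(xs, axis=1, keepdims=True)   # unit vectors
    S = (d / n) * (xs.T @ xs)                          # (d/n) * sum of x x^T
    eigs = np.linalg.eigvalsh(S)                       # ascending eigenvalues
    print(f"n={n:6d}  lambda_min={eigs[0]:.3f}  lambda_max={eigs[-1]:.3f}")
```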

CS 174 Lecture 10 John Canny

But we already saw that the tail probabilities of some random variables (e.g. the number of balls in a bin) fall off exponentially with distance from the mean. So Markov and Chebyshev are very poor bounds for those kinds of random variables. The Chernoff bound applies to a class of random variables and does give exponential fall-off of probability with distance from the mean. The critical condition that's needed for a C...
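To make the contrast concrete, here is a minimal sketch with toy numbers of our own (not from the CS 174 notes) comparing the three bounds on the upper tail of X ~ Binomial(n, 1/2): Markov stays constant, Chebyshev decays like 1/n, and the multiplicative Chernoff bound exp(−δ²μ/3) decays exponentially in n.

```python
import math

# Compare Markov, Chebyshev, and Chernoff tail bounds for X ~ Binomial(n, 1/2)
# and the event {X >= 3n/4}, i.e. relative deviation delta = 1/2 above mu = n/2.
for n in (40, 80, 160, 320):
    mu = n / 2
    a = 3 * n / 4
    markov = mu / a                            # P(X >= a) <= E[X]/a = 2/3, no decay
    chebyshev = (n / 4) / (n / 4) ** 2         # Var(X) = n/4; two-sided bound = 4/n
    chernoff = math.exp(-(0.5 ** 2) * mu / 3)  # P(X >= (1+d)mu) <= exp(-d^2 mu/3)
    print(f"n={n:4d}  Markov={markov:.3f}  Chebyshev={chebyshev:.4f}  "
          f"Chernoff={chernoff:.2e}")
```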


Publication date: 2016